Almost all societies through human history have been patriarchal; very few have been matriarchal. Some individuals try to blame it on Judeo-Christian teachings, but if one looks at other parts of the world and other world religions, patriarchal control is overwhelmingly dominant. Women and children were seen as property and treated as such. They had few, if any, rights. Women couldn't own property; if they weren't sufficiently submissive to their husbands, they could be committed to an insane asylum; and their only "career" choices were wife, teacher, or prostitute. Children were seen as expendable: put to work as young as age 5 in dangerous factory jobs, thrown into orphanages, or kicked out of the home at a young age.
As others noted, it has only been in the last 100 years that that hold has been loosened. The suffragettes were the trailblazers, and both World Wars helped open jobs up to women, but my personal opinion is that the 60s and reliable birth control are what really carried the women's movement forward, to the point of women sitting on the Supreme Court, running for president, and serving as vice president. Because women moved into more positions of authority over the past 100 years, children's rights were also brought to the forefront. Obviously, there is still a long way to go: the Equal Rights Amendment has never been ratified by enough states. I do have concern that, with our society becoming more and more polarized, we may go backward rather than forward, but I remain optimistic.